
    Converting Ontologies into DSLs

    This paper presents a project whose main objective is to explore the ontology-based development of Domain-Specific Languages (DSL), more precisely, of their underlying grammar. After reviewing the basic concepts characterizing Ontologies and Domain-Specific Languages, we introduce a tool, Onto2Gra, that takes advantage of the knowledge described by the ontology and automatically generates a grammar for a DSL that allows users to discourse about the domain described by that ontology. This approach represents a rigorous method to create, in a secure and effective way, a grammar for a new specialized language restricted to a concrete domain. The usual process of creating a grammar from scratch is, like every creative activity, difficult, slow and error-prone, so this proposal is, from a Grammar Engineering point of view, of utmost importance. After the grammar generation phase, the Grammar Engineer can manipulate it to add syntactic sugar that improves the quality of the final language, or even to add semantic actions. The Onto2Gra project is composed of three engines. The main one is OWL2DSL, the component that converts an OWL ontology into an attribute grammar. The two additional modules are Onto2OWL, which converts ontologies written in OntoDL (a lightweight DSL to describe ontologies) into standard OWL, and DDesc2OWL, which converts domain instances written in the DSL generated by OWL2DSL into the initial OWL ontology.
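
    To make the mapping concrete, the following minimal sketch (in Python, purely illustrative and not the actual OWL2DSL implementation) shows the core idea: each ontology concept becomes a non-terminal, its attributes become terminal clauses, and each relation becomes a production linking the two non-terminals involved. The toy ontology and the BNF-like output format are assumptions made only for this example.

```python
# Minimal sketch (not the actual OWL2DSL code) of deriving a grammar from an
# ontology: concepts become non-terminals, relations become productions.
# The tiny ontology below is purely illustrative.

ontology = {
    "concepts": ["Library", "Book", "Author"],
    "relations": [("Library", "holds", "Book"),
                  ("Book", "writtenBy", "Author")],
    "attributes": {"Book": ["title", "year"], "Author": ["name"]},
}

def ontology_to_grammar(onto):
    """Emit BNF-like productions for a DSL that talks about the domain."""
    rules = []
    for concept in onto["concepts"]:
        attrs = " ".join(f"'{a}' ':' STRING" for a in onto["attributes"].get(concept, []))
        body = f"'{concept.lower()}' '{{' {attrs} "
        # every relation whose source is this concept adds a nested clause
        for src, rel, dst in onto["relations"]:
            if src == concept:
                body += f"('{rel}' {dst.lower()})* "
        rules.append(f"{concept.lower()} : {body}'}}' ;")
    return "\n".join(rules)

print(ontology_to_grammar(ontology))
```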

    Urban Evolution of Fafe in the Last Two Centuries

    Human beings love to collect, store and preserve documents for later exploration, leading to the creation of Archives. Actually, consulting a municipal archive's assets, seeking information in order to explore the knowledge implicit in its documents, is the main reason for the existence of those memory institutions. On the other hand, it is known that the movement of people from dispersed living to concentration in urban environments has a strong impact both on human civilization and on the environment. This observation motivates Social Science researchers to study the urban evolution of cities. In this context, and having noticed that Fafe's Archive holds an important collection of municipal records (since the XIX century) concerning applications for authorization to construct or reconstruct private or public buildings, the idea arose to create a digital repository with those documents to enable their analysis. An information system shall be developed around it for information retrieval and knowledge exploration; it is also desirable that this application provides features to visualize the extracted information in convenient ways, such as positioning buildings on a map. This paper discusses the development of the referred Web-based system to study the Urban Evolution of Fafe in the XIX and XX Centuries, focusing on the ontology created to understand the domain to be explored. The definition of a markup language (an XML dialect) to annotate the Archive documents, in order to enable automatic data extraction and semantic search, is also one of the paper's topics. It will be discussed that this annotation was not defined from scratch; instead, its design followed the ontology. It is actually an ontology-driven system. Finally, the current state of the Web interface (the system front-end) will be presented.
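
    As an illustration of how such ontology-driven markup enables automatic data extraction, the Python sketch below parses a hypothetical annotated record and pulls out the fields needed to place a building on a map. The element and attribute names are invented for the example and do not reproduce the actual markup language defined in the paper.

```python
# Illustrative sketch only: the element names below are hypothetical and merely
# show how ontology-driven tags make automatic data extraction straightforward.
import xml.etree.ElementTree as ET

record = """
<buildingRequest year="1893">
  <applicant>Jose da Silva</applicant>
  <building kind="reconstruction">
    <street>Rua de Santo Antonio</street>
    <coordinates lat="41.454" lon="-8.171"/>
  </building>
</buildingRequest>
"""

def extract(doc: str) -> dict:
    """Pull out the fields needed to place the building on a map."""
    root = ET.fromstring(doc)
    coords = root.find("./building/coordinates")
    return {
        "year": int(root.get("year")),
        "applicant": root.findtext("applicant"),
        "street": root.findtext("./building/street"),
        "lat": float(coords.get("lat")),
        "lon": float(coords.get("lon")),
    }

print(extract(record))
```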

    Digital Collection Creator, Visualizer and Explorer

    In this paper we introduce and discuss a recent project, called CortaColaEspia, aimed at extending the 'Ontology-based Collection Processor' developed previously in the context of a Compilers course with some extra relevant features. The basic processor, based on the OntoDL tool, was able to read the ontological description of a small collection of objects (cards, pencils, toys, etc.) and automatically produce a web-based exhibition space to display the objects, providing conceptual navigation through them. The extension under discussion is intended to create a new DSL to describe the details of the exhibition room organization (which concepts and relations to show; where and how to show them; etc.). A second objective consists of a new module to merge two collections, or to enrich a collection with extra information about the collected objects. The last requirement is the incorporation of a natural language processor to analyze the objects' captions or short inscriptions in order to extract information that can create knowledge about a specific domain, a society or an epoch.
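
    The collection-merging objective can be pictured with the small sketch below: two collections described as plain dictionaries are merged, and entries describing the same object are enriched with each other's fields. The data model and the function are hypothetical, not the project's actual representation.

```python
# Rough sketch (names invented) of the collection-merging idea: shared object
# ids get their fields united, with the first collection taking precedence.

def merge_collections(a: dict, b: dict) -> dict:
    """Merge collection b into collection a."""
    merged = {oid: dict(props) for oid, props in a.items()}
    for oid, props in b.items():
        target = merged.setdefault(oid, {})
        for key, value in props.items():
            target.setdefault(key, value)   # keep a's value when both define it
    return merged

cards = {"card-01": {"caption": "Bom Jesus, Braga", "year": 1925}}
extra = {"card-01": {"photographer": "unknown"},
         "card-02": {"caption": "Avenida dos Aliados, Porto"}}
print(merge_collections(cards, extra))
```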

    DOLPHIN-FEW - An example of a Web system to analyze and study compilers behavior

    DOLPHIN is a framework conceived to develop and test compiler components. DOLPHIN-FEW (Front-End for the Web) is the DOLPHIN module that integrates all Web-related functionalities. Initially conceived to monitor the behavior of some routines of the compiler's back-end, it is nowadays also usable as a visual tool to teach how those code analysis, optimization and code generation routines work. This paper introduces DOLPHIN-FEW, a software system that takes advantage of the web environment and associated technologies to be a powerful pedagogical tool for teaching compiler construction topics.

    Automatic Test Generation for Space

    The European Space Agency (ESA) uses an engine to perform tests on the Ground Segment infrastructure, especially the Operational Simulator. This engine uses many different tools to support the development of a regression testing infrastructure, and these tests perform black-box testing of the C++ simulator implementation. VST (VisionSpace Technologies) is one of the companies that provides these services to ESA, and it needs a tool to automatically infer tests from the existing C++ code, instead of manually writing test scripts. With this motivation in mind, this paper explores automatic testing approaches and tools in order to propose a system that satisfies VST's needs.
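
    To give a rough idea of what such inference could look like, the sketch below (not the system proposed in the paper) scans a C++ header with a naive regular expression and emits a GoogleTest-style skeleton for each free function, rather than having those skeletons written by hand. The header content, the regex and the output format are all assumptions made for the example.

```python
# Hedged sketch of the general idea only: scan a C++ header for free-function
# declarations and emit a skeleton test case for each one.
import re

HEADER = """
int add(int a, int b);
double norm(double x, double y);
void reset();
"""

DECL = re.compile(r"^\s*([\w:<>]+)\s+(\w+)\s*\(([^)]*)\)\s*;", re.MULTILINE)

def generate_test_skeletons(header: str) -> str:
    tests = []
    for ret, name, params in DECL.findall(header):
        tests.append(
            f"TEST(AutoGenerated, {name}) {{\n"
            f"    // TODO: choose inputs for ({params or 'void'}) and check the {ret} result\n"
            f"    // e.g. EXPECT_EQ(expected, {name}(...));\n"
            f"}}\n"
        )
    return "\n".join(tests)

print(generate_test_skeletons(HEADER))
```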

    Applying compiler technology to solve generic

    Compilers are tools that transform a high-level programming language into assembly or binary code. The essential part of the process is done by the interpretation and code generation steps, but nowadays most compilers also have a strong code optimization component that exploits as much as possible the potential of the computer architectures for which the compiler must generate code. These optimizations are based on the information provided by several analysis processes. This paper presents some of these code analyses and optimizations, and shows how they can be used to solve problems or improve the quality of solutions in areas such as industrial engineering and planning.
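
    A classic example of such an optimization is constant folding; the short sketch below (illustrative only, not taken from the paper) evaluates constant sub-expressions of a tiny expression tree at compile time.

```python
# Small illustrative sketch of one classic optimization, constant folding,
# applied to a tiny expression AST encoded as nested tuples ("op", left, right).
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def fold(node):
    """Recursively replace operator nodes whose operands are constants."""
    if isinstance(node, tuple):
        op, left, right = node
        left, right = fold(left), fold(right)
        if isinstance(left, (int, float)) and isinstance(right, (int, float)):
            return OPS[op](left, right)   # evaluate at compile time
        return (op, left, right)
    return node                           # constant or variable name

# 2 * 3 folds to 6 and 2 + 2 folds to 4; the variable x blocks further folding
expr = ("+", ("*", 2, 3), ("*", "x", ("+", 2, 2)))
print(fold(expr))   # ('+', 6, ('*', 'x', 4))
```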

    Data flow analysis applied to optimize generic workflow problems

    The compilation process, the one that transforms a program written in a high-level language into assembly or binary code, is a very elaborate process that mixes several powerful technologies, some of them developed specifically for this area. Nowadays, compilers are highly developed systems that can analyze and improve the source code quite efficiently, profiting from all the potential of the new processor architectures. This paper introduces a common type of analysis, Data Flow Analysis, which is used to compute flow-sensitive information about programs, whose results are essential to produce many code optimizations. It is also argued that the problem of analyzing the data flow in software programs has many similarities with the problems found in industrial engineering, planning and management. As a consequence, it is possible to apply the analysis and optimization techniques used by compilers in these areas.
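
    The sketch below illustrates the iterative, fixed-point flavour of such an analysis with live-variable analysis on a tiny three-block control flow graph; the block encoding and the example graph are assumptions made only for the illustration, not the paper's formalization.

```python
# Minimal sketch of an iterative data flow analysis: live-variable analysis
# on a tiny control flow graph, solved by iterating to a fixed point.

# Each block: (variables used before being defined, variables defined)
blocks = {
    "B1": ({"a"}, {"b"}),
    "B2": ({"b"}, {"c"}),
    "B3": ({"b", "c"}, {"a"}),
}
successors = {"B1": ["B2"], "B2": ["B3"], "B3": ["B2"]}  # B3 loops back to B2

def live_variables(blocks, successors):
    live_out = {b: set() for b in blocks}
    changed = True
    while changed:                       # iterate until nothing changes
        changed = False
        for b in blocks:
            out = set()
            for s in successors.get(b, []):
                use, define = blocks[s]
                out |= use | (live_out[s] - define)   # live-in of the successor
            if out != live_out[b]:
                live_out[b], changed = out, True
    return live_out

print(live_variables(blocks, successors))
```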

    NetLangEd, A Web Editor to Support Online Comment Annotation

    This paper focuses on the scientific areas of Digital Humanities, Social Networks and Inappropriate Social Discourse. The main objective of this research project is the development of an editor that allows researchers in the human and social sciences, or psychologists, to add the reflections or ideas that emerge from reading and analyzing posts and comments of an online corpus. In the present context, the editor is being integrated with the analysis tools available in the NetLang platform. NetLangEd, in addition to allowing the three basic operations of adding, editing and removing annotations, will also offer mechanisms to manage, organize, view and locate annotations, all of which will be performed in an easy, fast and user-friendly way.
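
    A back-of-the-envelope sketch of those basic operations is given below: an in-memory store that supports adding, editing, removing and locating annotations attached to spans of a post. The class and method names are hypothetical and do not describe NetLangEd's actual API.

```python
# Hypothetical sketch (not NetLangEd's API): the three basic operations plus a
# simple "locate" query over annotations attached to spans of a post.
from dataclasses import dataclass
import itertools

@dataclass
class Annotation:
    post_id: str
    start: int        # character offsets into the post text
    end: int
    note: str

class AnnotationStore:
    def __init__(self):
        self._items: dict[int, Annotation] = {}
        self._ids = itertools.count(1)

    def add(self, ann: Annotation) -> int:
        aid = next(self._ids)
        self._items[aid] = ann
        return aid

    def edit(self, aid: int, note: str) -> None:
        self._items[aid].note = note

    def remove(self, aid: int) -> None:
        del self._items[aid]

    def locate(self, post_id: str, offset: int):
        """Find the annotations covering a given position of a post."""
        return [a for a in self._items.values()
                if a.post_id == post_id and a.start <= offset < a.end]

store = AnnotationStore()
aid = store.add(Annotation("post-42", 10, 25, "irony marker"))
store.edit(aid, "irony / sarcasm marker")
print(store.locate("post-42", 12))
```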

    Type Annotation for SAST


    Visualization/animation of programs in Alma: obtaining different results

    Alma, a system for program animation, receives a computer program as input and produces a sequence of visualizations that describe its functionality. The system automatically generates program animations, basing this process on the internal representation of those programs. The back-end of this system works over an execution tree (DAST, Decorated Abstract Syntax Tree), implementing the animation algorithm. This algorithm uses two bases of rules: visualizing rules (to associate graphical representations with program elements, creating a visual description of the program state) and rewriting rules (to change the program state). In this paper the main goal will be to present the extensibility of the system in the sense of adding or modifying inputs and outputs. We also discuss the characteristics of Alma's architecture that make this possible.
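
    The interplay of the two rule bases can be sketched roughly as below (purely illustrative, not Alma's actual rule format): a rewriting rule advances the program state, and a visualizing rule maps each state to a drawable frame of the animation.

```python
# Rough sketch of the two rule bases (illustrative only, not Alma's rules):
# rewriting rules change the program state, visualizing rules render it.

def rewrite_assign(state, stmt):
    """Rewriting rule: an assignment updates the environment and advances pc."""
    var, value = stmt
    new_env = dict(state["env"])
    new_env[var] = value
    return {"env": new_env, "pc": state["pc"] + 1}

def visualize(state):
    """Visualizing rule: map the state to a textual 'frame' of the animation."""
    cells = ", ".join(f"{k}={v}" for k, v in sorted(state["env"].items()))
    return f"step {state['pc']}: [{cells}]"

program = [("x", 1), ("y", 2), ("x", 3)]          # a toy list of assignments
state = {"env": {}, "pc": 0}
frames = [visualize(state)]
for stmt in program:
    state = rewrite_assign(state, stmt)
    frames.append(visualize(state))
print("\n".join(frames))
```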